Bound on the Leave-One-Out Error for ν-SVMs

Authors

  • Arthur Gretton
  • Ralf Herbrich
  • Peter J. W. Rayner
Abstract

A bound on the leave-one-out error for ν-support vector (SV) machine binary classifiers is presented. This bound is based on the geometrical concept of the span, which was introduced in the context of bounding the leave-one-out error for C-SV machine binary classifiers. It is shown that the bound presented herein requires less restrictive assumptions than the C-SV bound, and that the prediction of generalisation performance is improved, in particular for noisy data sets.
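For context, a minimal sketch of the span bound in its original C-SV form (due to Vapnik and Chapelle), on which the present bound builds; the ν-SVM statement and its weaker assumptions are given in the paper itself. Assuming the set of support vectors is unchanged when a point is left out, the leave-one-out error of an SVM trained on ℓ examples satisfies

    \mathrm{err}_{\mathrm{loo}} \le \frac{1}{\ell} \left|\left\{ p : \alpha_p S_p^2 \ge 1 \right\}\right|,
    \qquad
    S_p^2 = \min_{\lambda :\, \sum_{i \ne p} \lambda_i = 1} \Bigl\| \Phi(x_p) - \sum_{i \ne p} \lambda_i \Phi(x_i) \Bigr\|^2,

where the α_p are the dual coefficients, Φ is the feature map, and the span S_p is the distance from Φ(x_p) to the affine hull of the remaining support vectors in feature space.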

Related articles

Adaptive Margin Support Vector Machines

In this chapter we present a new learning algorithm, Leave-One-Out (LOO-) SVMs and its generalization Adaptive Margin (AM-) SVMs, inspired by a recent upper bound on the leave-one-out error proved for kernel classifiers by Jaakkola and Haussler. The new approach minimizes the expression given by the bound in an attempt to minimize the leave-one-out error. This gives a convex optimization problem...
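For reference, a sketch of the Jaakkola–Haussler bound the chapter builds on, stated for a kernel classifier f(x) = Σ_i α_i y_i K(x_i, x) without a bias term (the chapter's exact form may differ):

    \mathrm{err}_{\mathrm{loo}} \le \frac{1}{\ell} \sum_{p=1}^{\ell} \theta\bigl( \alpha_p K(x_p, x_p) - y_p f(x_p) \bigr),

with θ the step function: leaving out point p can reduce its margin y_p f(x_p) by at most α_p K(x_p, x_p), so only points where this inequality triggers can become leave-one-out errors.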


Leave-One-Out Support Vector Machines

We present a new learning algorithm for pattern recognition inspired by a recent upper bound on leave-one-out error [Jaakkola and Haussler, 1999] proved for Support Vector Machines (SVMs) [Vapnik, 1995; 1998]. The new approach directly minimizes the expression given by the bound in an attempt to minimize leave-one-out error. This gives a convex optimization problem which constructs a sparse lin...
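A hedged sketch of the convex programme this yields, assuming the usual statement of the LOO-SVM (the full text fixes the details): each point must be classified with unit margin by the kernel expansion over the remaining points, with slack ξ_p penalised linearly,

    \min_{\alpha \ge 0,\, \xi \ge 0} \ \sum_{p=1}^{\ell} \xi_p
    \quad \text{s.t.} \quad
    y_p \sum_{i \ne p} \alpha_i y_i K(x_i, x_p) \ge 1 - \xi_p, \qquad p = 1, \dots, \ell.

In this form no soft-margin constant appears to be tuned; the objective itself plays the role of the leave-one-out bound being minimised.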


A Quadratic Loss Multi-Class SVM

Using a support vector machine requires setting two types of hyperparameters: the soft margin parameter C and the parameters of the kernel. To perform this model selection task, the method of choice is cross-validation. Its leave-one-out variant is known to produce an estimator of the generalization error which is almost unbiased. Its major drawback rests in its time requirement. To overcome thi...
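As an illustration of the leave-one-out model selection the abstract describes, a minimal sketch in Python with scikit-learn (the dataset and candidate grid are placeholders, not the paper's experimental setup):

    # Minimal sketch: select the soft-margin parameter C by leave-one-out
    # cross-validation. The estimate is almost unbiased, but the SVM must be
    # retrained once per training point and per candidate C -- hence the
    # time requirement the abstract mentions.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.model_selection import LeaveOneOut, cross_val_score
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    loo = LeaveOneOut()

    best_C, best_score = None, -np.inf
    for C in [0.1, 1.0, 10.0, 100.0]:           # candidate hyperparameter grid
        scores = cross_val_score(SVC(C=C, kernel="rbf"), X, y, cv=loo)
        if scores.mean() > best_score:          # mean LOO accuracy estimate
            best_C, best_score = C, scores.mean()

    print(f"best C = {best_C}, LOO accuracy = {best_score:.3f}")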


A Nonconformity Approach to Model Selection for SVMs

We investigate the issue of model selection and the use of the nonconformity (strangeness) measure in batch learning. Using the nonconformity measure we propose a new training algorithm that helps avoid the need for cross-validation or leave-one-out model selection strategies. We provide a new generalisation error bound using the notion of nonconformity to upper bound the loss of each test exam...
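A brief sketch of one common margin-based nonconformity measure for a classifier with real-valued output f (an assumed illustrative choice, not necessarily the measure proposed in the paper):

    \alpha_i = - y_i f(x_i),

so an example classified confidently and correctly receives a very negative (conforming) score, while one near or beyond the decision boundary receives a large (strange) score; comparing a new example's score against the training scores is what calibrates conformal-style methods.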


Radius-Margin Bound on the Leave-One-Out Error of the LLW-M-SVM

To set the values of the hyperparameters of a support vector machine (SVM), one can use cross-validation. Its leave-one-out variant produces an estimator of the generalization error which is almost unbiased. Its major drawback rests in its time requirement. To overcome this difficulty, several upper bounds on the leave-one-out error of the pattern recognition SVM have been derived. The most pop...
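For context, the classical radius-margin bound for the hard-margin binary SVM (a sketch of the form usually attributed to Vapnik; the paper's contribution is an analogue for the multi-class LLW-M-SVM):

    \mathrm{err}_{\mathrm{loo}} \le \frac{1}{\ell} \, \frac{4 R^2}{\gamma^2} = \frac{4 R^2 \lVert w \rVert^2}{\ell},

where R is the radius of the smallest sphere enclosing the training points in feature space and γ = 1/‖w‖ is the geometric margin. Both quantities can be evaluated from a single training run, avoiding the cost of explicit leave-one-out estimation.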



Journal:

Volume   Issue

Pages  -

Publication date: 2007